11 research outputs found

    Opinion mining summarization and automation process a survey

    In the modern age, the internet is a powerful source of information: roughly one-third of the world's population spends a significant amount of time and money surfing it. People draw vast amounts of information from it in every field of life, such as learning, entertainment, communication, and shopping. Users visit websites and post remarks or views on products, services, and events based on their experience, and these views can be useful to other users. In this way, a huge amount of feedback accumulates on the web as textual data, which can be explored, evaluated, and harnessed for decision making. Opinion Mining (OM) is a branch of Natural Language Processing (NLP) that extracts the theme or idea behind users' opinions and labels them as positive, negative, or neutral. Researchers therefore try to present this information as a summary that is useful to different users. The research community has worked on automatic summarization since the 1950s, and the approaches fall into two categories: abstractive and extractive methods. This paper presents an overview of useful OM methods and explains OM with respect to summarization and its automation process.
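    The two tasks the survey covers can be sketched in miniature: lexicon-based polarity classification and extractive summarization by sentence scoring. The word lists and scoring rule below are tiny illustrative stand-ins, not any method from the surveyed literature.

```python
from collections import Counter
import re

# Toy sentiment lexicon; real OM systems use much larger curated lexicons
# or trained classifiers.
POSITIVE = {"good", "great", "useful", "excellent", "love"}
NEGATIVE = {"bad", "poor", "useless", "terrible", "hate"}

def classify_opinion(text):
    """Label a comment positive, negative, or neutral by lexicon counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def extractive_summary(text, k=1):
    """Extractive summarization: keep the k sentences whose words occur
    most frequently across the whole text."""
    sentences = [s.strip() for s in re.split(r"[.!?]", text) if s.strip()]
    freq = Counter(w for s in sentences for w in s.lower().split())
    ranked = sorted(sentences,
                    key=lambda s: sum(freq[w] for w in s.lower().split()),
                    reverse=True)
    return ranked[:k]
```

    An abstractive method, by contrast, would generate new sentences rather than select existing ones, which is why it is the harder of the two categories.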

    An e-learning system in Malaysia based on green computing and energy level

    Rising energy costs and growing environmental concern have brought green computing more and more attention. Power and energy are primary concerns in designing and implementing green computing, whose main goal is to make the computing world environmentally friendly. This paper analyses a comparison between green computing and conventional computing in an e-learning environment. The results show that green computing is environmentally friendly and consumes less energy. The paper therefore offers suggestions for overcoming one of the main challenges in this area: converting conventional computing into green computing. We also identify specific areas of an e-learning centre in Malaysia that consume more energy than a green-computing setup would. The simulation results show an energy reduction of more than 30% when green computing is used.
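    The scale of such a reduction can be illustrated with a back-of-the-envelope calculation. All wattages and usage hours below are assumed figures for illustration only, not measurements from the paper's e-learning centre.

```python
# Hypothetical annual energy comparison for one e-learning workstation.
def annual_kwh(watts, hours_per_day, days_per_year=260):
    """Annual energy use in kilowatt-hours for one workstation."""
    return watts * hours_per_day * days_per_year / 1000.0

conventional = annual_kwh(watts=150, hours_per_day=8)  # assumed desktop PC draw
green        = annual_kwh(watts=100, hours_per_day=8)  # assumed efficient setup

reduction_pct = (conventional - green) / conventional * 100
```

    With these assumed figures the reduction is about 33%, in the same range as the paper's reported 30%-plus saving.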

    Measuring the BDARX architecture by agent oriented system a case study

    Distributed systems are increasingly designed as multi-agent systems, which help in building highly robust, complex industrial software. Modern distributed cooperative applications are openly accessible, dynamic, and large-scale, and the potential of decentralized software solutions hardly needs emphasis: the main benefit lies in the distributed nature of information, resources, and action. On the other hand, progress in multi-agent systems creates new challenges for traditional fault-tolerance methodologies, which typically rely on centralized, offline solutions. Research on multi-agent systems has gained attention for designing software that operates in distributed, open environments such as the Internet. DARX (Dynamic Agent Replication eXtension) is an architecture aimed at building reliable software that is both flexible and scalable, providing adaptive fault tolerance through dynamic replication. BDARX, an enhancement of DARX, adds a dynamic solution for Byzantine faults to agent-based systems built on DARX. It improves the long-run fault tolerance of multi-agent systems and makes the software more robust against such arbitrary faults. BDARX achieves Byzantine fault tolerance in DARX by creating replicas on both sides of a communication between agents, using a BFT protocol for agent systems, instead of replicating only on the server side and assuming the client is failure-free. This paper shows that the dynamic behaviour of agents removes the need to distinguish between server and client replicas.
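    The quorum arithmetic underlying BFT replication schemes of this kind can be sketched briefly: n = 3f + 1 replicas are needed to tolerate f Byzantine faults, and a reply is accepted once f + 1 replicas agree on it. The function names below are illustrative, not part of the DARX or BDARX API.

```python
from collections import Counter

def replicas_required(f):
    """Minimum replica count needed to tolerate f Byzantine faults."""
    return 3 * f + 1

def accept_reply(responses, f):
    """Accept the first value vouched for by at least f + 1 replicas;
    fewer matching responses could all come from faulty replicas."""
    value, count = Counter(responses).most_common(1)[0]
    return value if count >= f + 1 else None
```

    Replicating both communicating agents, as BDARX does, means this quorum check applies symmetrically rather than assuming a failure-free client.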

    Real-Time Wheat Classification System for Selective Herbicides Using Broad Wheat Estimation in Deep Neural Network

    Identifying seeds manually takes too long for practical agricultural applications, so automatic and reliable plant-seed identification is of technical and economic importance to the agricultural industry. In addition, current trends in big data and data analysis give scientists many opportunities to use data mining techniques for better decisions in various applications, and a growing number of applications use computer-aided methods to improve quality control. Classifying different types of wheat plays a significant role in agriculture, and improving systems that make distinctions based on the shape, colour, and texture of wheat plantation is crucial. The main objective of this paper is to develop a machine vision system that identifies wheat based on its location. For this purpose, a real-time robotic system is developed to find plants in the surrounding area using pattern recognition and machine vision. For real-time, selective herbicide application, images are categorized as either broad or narrow by an algorithm based on morphological operations. Several experiments were conducted to gauge the efficiency of the proposed algorithm at distinguishing between wheat types, and it also performed admirably under varying field conditions. The simulation results show a 94% success rate in categorizing a wheat population of 80 samples, of which 40 are narrow and 40 broad.
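    The broad-versus-narrow decision can be illustrated with a simple shape measure: how much of its bounding box a segmented region fills, since broad leaves fill more of the box than narrow, elongated ones. The 0.5 threshold and the toy pixel regions below are assumptions for illustration; the paper's actual pipeline applies morphological image operations to real field images.

```python
# A segmented plant region is represented here as a set of (row, col)
# pixel coordinates produced by some earlier segmentation step.
def bounding_box_fill(region):
    """Fraction of the region's bounding box covered by its pixels."""
    rows = [r for r, _ in region]
    cols = [c for _, c in region]
    box_area = (max(rows) - min(rows) + 1) * (max(cols) - min(cols) + 1)
    return len(region) / box_area

def classify_plant(region, threshold=0.5):
    """Label a region broad or narrow by its bounding-box fill ratio."""
    return "broad" if bounding_box_fill(region) >= threshold else "narrow"
```

    A diagonal streak of pixels fills only a small fraction of its box and is labelled narrow, while a compact blob fills nearly all of it and is labelled broad.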

    Search Engine Optimization Algorithms for Page Ranking: Comparative Study

    The number of visitors grows day by day with the rapid expansion of the World Wide Web, which to date contains more than 11.3 billion web pages. In the modern era of technology and advanced computation, page ranking has become a common feature of retrieval systems. However, any query submitted to a search engine returns both relevant and irrelevant data, which adds overhead to the search engine and degrades the page-ranking process. A new optimization technique is needed to improve on existing search engine optimization and raise page ranking quality. This paper presents a comparative study of different page-ranking algorithms for search engine optimization and explores the improvements needed to overcome the current problems in this field. Analysis of the simulation results clearly shows the need for a new optimization technique, one that reduces complexity and user overhead by displaying only related data.
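    The best-known algorithm in such a comparison is PageRank, which can be sketched as a power iteration over the link graph. Here `links` maps each page to the pages it links to, and 0.85 is the conventional damping factor.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over a dict {page: [linked pages]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page receives the teleportation share (1 - d) / n.
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # A dangling page spreads its rank evenly over all pages.
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank
```

    On a symmetric three-page cycle (a links to b, b to c, c to a) every page ends with rank 1/3, and the ranks always sum to 1.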

    A new biometric matching fingerprints system for personal authentication using R305

    Nowadays, security is a central research concern for both the public and private sectors. It has become a major issue in many places in daily life, such as offices, libraries, hospitals, houses, laboratories, educational institutes, and military areas, where privacy and confidentiality requirements are very high. A number of methods address this issue in different ways; one method of authenticating an individual at the entrance to a restricted place is a door-lock security system. This paper implements a fingerprint-based security system built around the R305 sensor, which authenticates users through the unique patterns of their fingerprints, with an Arduino UNO board providing the physical security control. The project serves as a prototype biometric system, since each fingerprint has a unique pattern that cannot be stolen, shared, or otherwise misused. The effectiveness of the proposed system has been verified, and the results show 95% matching accuracy on the selected datasets.
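    The decision logic around such a sensor is simple to sketch: a module like the R305 matches a scanned print against its stored templates on-board and reports a template ID or a no-match result. The sentinel value and function below are hypothetical illustrations; driving the real hardware requires the module's serial packet protocol, which is stubbed out here.

```python
NO_MATCH = -1  # assumed sentinel for "no stored template matched"

def door_should_unlock(matched_template_id, authorised_ids):
    """Unlock only when the sensor reports an enrolled template ID.

    `matched_template_id` stands in for what the fingerprint module
    would return after an on-board match; `authorised_ids` is the set
    of template IDs enrolled for this door.
    """
    return matched_template_id != NO_MATCH and matched_template_id in authorised_ids
```

    Keeping the authorised-ID set on the controller, rather than trusting any successful match, lets one sensor serve doors with different access lists.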

    Comparative Analysis for Heart Disease Prediction

    Today, heart disease has become one of the leading causes of death nationwide. The best prevention is an early-warning system that can predict the first symptoms, which can save more lives. Research in data mining has recently gained a lot of attention and has been applied in many kinds of applications, including medicine, where data mining techniques can help researchers predict the probability of heart disease among susceptible patients. In prior studies, several researchers have directed their efforts at finding the best possible technique for a heart disease prediction model. This study draws a comparison among the different algorithms used to predict heart disease. Its results will help build an understanding of the recent methodologies used for heart disease prediction models. The paper presents an analysis of the significant data mining techniques that can be used to develop highly accurate and efficient prediction models, helping doctors reduce the number of deaths caused by heart disease.
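    The shape of such a head-to-head comparison can be shown in miniature: two predictors, a majority-class baseline and a 1-nearest-neighbour rule, scored on the same held-out records. The (age, resting blood pressure) records and labels below are synthetic stand-ins, not real patient data or the paper's datasets.

```python
def majority_class(labels):
    """Baseline: always predict the most common training label."""
    return max(set(labels), key=labels.count)

def one_nn(train_X, train_y, x):
    """Predict the label of the closest training record (squared distance)."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(zip(train_X, train_y), key=lambda pair: dist(pair[0], x))[1]

def accuracy(predictions, truth):
    return sum(p == t for p, t in zip(predictions, truth)) / len(truth)

# Synthetic records: (age, resting blood pressure); label 1 = heart disease.
train_X = [(40, 120), (42, 118), (65, 160), (70, 155)]
train_y = [0, 0, 0, 1]
test_X, test_y = [(41, 119), (70, 156)], [0, 1]

nn_acc = accuracy([one_nn(train_X, train_y, x) for x in test_X], test_y)
baseline = majority_class(train_y)
baseline_acc = accuracy([baseline] * len(test_X), test_y)
```

    Real comparisons of this kind use cross-validation on clinical datasets and stronger learners, but the scoring protocol, identical splits and a shared accuracy metric, is the same.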

    The impact of search engine optimization on the visibility of research paper and citations

    The initial criterion for evaluating a researcher's output is the number of papers published; beyond that, the number of citations is a significant measure of the quality of an author's research. Citations are typically linked directly to the visibility of a research paper. Many studies have shown that this visibility can be improved by applying search engine optimization techniques, and some have already demonstrated that better visibility of an article can improve its citation results. In this article, we analyse the impact of search engine optimization techniques on the visibility of a research paper and propose strategies that can help make research publications visible to a large number of users.
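    One concrete technique in this area is embedding Highwire-style citation meta tags, the kind scholarly indexers such as Google Scholar describe in their inclusion guidelines, in a paper's landing page. The generator below is a sketch, and the paper details passed to it are placeholders, not drawn from any article above.

```python
def citation_meta_tags(title, authors, year, pdf_url):
    """Build the citation_* meta tags an indexer reads from a landing page."""
    tags = ['<meta name="citation_title" content="{}">'.format(title)]
    tags += ['<meta name="citation_author" content="{}">'.format(a)
             for a in authors]
    tags.append('<meta name="citation_publication_date" content="{}">'.format(year))
    tags.append('<meta name="citation_pdf_url" content="{}">'.format(pdf_url))
    return "\n".join(tags)
```

    Each author gets a separate citation_author tag; crawlers use the citation_pdf_url tag to locate the full text without scraping page links.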
